Optimized Binary Patterns by Gradient Descent for Ghost Imaging


Abstract

Ghost imaging reconstructs images using a single-element photodetector by illuminating an object with binary modulation patterns. This technique has various advantages, including wide wavelength applicability, noise robustness, and high measurement sensitivity. However, one challenge is low image quality under undersampling, and the design of illumination patterns has been examined to address this issue. In ghost imaging, both randomly generated patterns and orthogonal-basis patterns have been studied; under undersampling, random patterns exhibit noise robustness but low image quality, whereas basis patterns achieve high resolution but are sensitive to noise. Ghost imaging therefore requires patterns that simultaneously achieve high resolution and noise robustness. This study proposes a method of pattern optimization by gradient descent with binarization to further improve image quality. Numerical simulation and experimental results show that the proposed approach offers both high image quality and robustness to noise.
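A minimal sketch of the general idea, not the authors' implementation: relax the binary patterns to continuous values, descend on a reconstruction loss over training images, and threshold the result to {0, 1}. The linear back-projection reconstruction model and all parameter values below are assumptions for illustration.

```python
import numpy as np

def optimize_binary_patterns(train_imgs, n_patterns, n_iters=200, lr=0.1, seed=0):
    """Optimize illumination patterns by gradient descent, then binarize.

    train_imgs: (K, N) array of K flattened training images with N pixels.
    Returns an (n_patterns, N) binary {0, 1} pattern matrix.
    """
    rng = np.random.default_rng(seed)
    K, N = train_imgs.shape
    # Continuous relaxation of the binary patterns, kept inside [0, 1].
    P = rng.uniform(0.4, 0.6, size=(n_patterns, N))
    X = train_imgs.T  # (N, K)
    for _ in range(n_iters):
        Y = P @ X                        # simulated single-pixel measurements
        R = (P.T @ Y) / n_patterns - X   # back-projection reconstruction error
        # Gradient of 0.5 * ||P^T P X / n_patterns - X||_F^2, averaged over K images
        G = (Y @ R.T + (P @ R) @ X.T) / (n_patterns * K)
        P = np.clip(P - lr * G, 0.0, 1.0)
    return (P > 0.5).astype(float)
```

The final thresholding is the simplest possible binarization; the paper's point is that optimizing before binarizing can outperform both random and fixed-basis patterns under undersampling.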


Related Articles

Handwritten Character Recognition using Modified Gradient Descent Technique of Neural Networks and Representation of Conjugate Descent for Training Patterns

The purpose of this study is to analyze the performance of the backpropagation algorithm with changing training patterns and a second momentum term in feedforward neural networks. The analysis is conducted on 250 different words of three small letters from the English alphabet. These words are presented to two vertical segmentation programs, designed in MATLAB, based on portions (1...
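The momentum term mentioned above augments plain gradient descent by accumulating a decaying average of past gradients. A generic sketch on a one-dimensional quadratic (not the paper's modified two-momentum variant; all values are illustrative):

```python
def minimize_with_momentum(target=3.0, lr=0.05, beta=0.9, n_steps=200):
    """Gradient descent with a momentum term on f(w) = (w - target)^2."""
    w, v = 0.0, 0.0
    for _ in range(n_steps):
        grad = 2.0 * (w - target)  # df/dw
        v = beta * v - lr * grad   # momentum: decaying sum of past gradients
        w = w + v                  # parameter update
    return w
```

The same update applied per weight is what accelerates backpropagation training when consecutive gradients point in consistent directions.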


Learning to learn by gradient descent by gradient descent

The move from hand-designed features to learned features in machine learning has been wildly successful. In spite of this, optimization algorithms are still designed by hand. In this paper we show how the design of an optimization algorithm can be cast as a learning problem, allowing the algorithm to learn to exploit structure in the problems of interest in an automatic way. Our learned algorit...


Learning to Learn without Gradient Descent by Gradient Descent

We learn recurrent neural network optimizers trained on simple synthetic functions by gradient descent. We show that these learned optimizers exhibit a remarkable degree of transfer in that they can be used to efficiently optimize a broad range of derivative-free black-box functions, including Gaussian process bandits, simple control objectives, global optimization benchmarks and hyper-paramete...


Learning by Online Gradient Descent

We study online gradient-descent learning in multilayer networks analytically and numerically. The training is based on randomly drawn inputs and their corresponding outputs as defined by a target rule. In the thermodynamic limit we derive deterministic differential equations for the order parameters of the problem which allow an exact calculation of the evolution of the generalization error. Fi...
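The setting described, learning a target rule from one freshly drawn random input per step, can be sketched with a linear teacher and student (a simplified stand-in for the multilayer networks the paper analyzes; dimensions and learning rate are arbitrary):

```python
import numpy as np

def online_gd_generalization_error(n_dim=50, n_steps=5000, lr=0.1, seed=0):
    """Online gradient descent: a linear student learns a linear teacher rule
    from one randomly drawn example per step."""
    rng = np.random.default_rng(seed)
    w_teacher = rng.standard_normal(n_dim)
    w_student = np.zeros(n_dim)
    for _ in range(n_steps):
        x = rng.standard_normal(n_dim)
        err = (w_student - w_teacher) @ x  # prediction error on this example
        w_student -= lr * err * x / n_dim  # squared-loss gradient step
    # Generalization error for Gaussian inputs: squared weight distance / n_dim
    return float(np.sum((w_student - w_teacher) ** 2) / n_dim)
```

In the thermodynamic limit (n_dim to infinity), the average trajectory of this error is exactly what the paper's order-parameter differential equations describe.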


Fast Binary Compressive Sensing via ℓ0 Gradient Descent

We present a fast compressive sensing algorithm for the reconstruction of {0, 1}-valued binary signals from their linear measurements. The proposed algorithm minimizes a non-convex penalty function given by a weighted sum of smoothed ℓ0 norms, under the [0, 1] box constraint. It is experimentally shown that the proposed algorithm is not only significantly faster than linear...
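The formulation described, a data-fit term plus smoothed ℓ0 penalties minimized under the [0, 1] box constraint, can be sketched as follows. This is an assumed formulation for illustration, not the paper's exact algorithm: the Gaussian smoothing of the ℓ0 norm and all parameter values are assumptions.

```python
import numpy as np

def binary_cs_l0_descent(A, b, lam=0.5, sigma=0.3, n_iters=2000):
    """Recover a {0, 1}-valued x from b = A x by projected gradient descent on
    ||A x - b||^2 + lam * (smoothed-l0 terms pulling entries toward 0 and 1)."""
    m, n = A.shape
    lr = 0.5 / np.linalg.norm(A, 2) ** 2  # step below 1/L for the fit term
    x = np.full(n, 0.5)                   # start at the center of the box
    for _ in range(n_iters):
        grad_fit = 2.0 * A.T @ (A @ x - b)
        # Gradients of 1 - exp(-t^2 / (2 sigma^2)) at t = x and t = x - 1:
        g0 = (x / sigma**2) * np.exp(-x**2 / (2 * sigma**2))
        g1 = ((x - 1) / sigma**2) * np.exp(-(x - 1) ** 2 / (2 * sigma**2))
        x = np.clip(x - lr * (grad_fit + lam * (g0 + g1)), 0.0, 1.0)
    return (x > 0.5).astype(float)
```

The two smoothed-ℓ0 terms penalize entries away from 0 and away from 1 respectively, so their sum is smallest at the box corners, which drives the relaxed solution toward a binary one.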



Journal

Journal: IEEE Access

Year: 2021

ISSN: 2169-3536

DOI: https://doi.org/10.1109/access.2021.3094576